Supplementary Material: Learning Affinity via Spatial Propagation Networks
Abstract
Theorem 1 (i.e., Theorem 3 in the paper) shows that the stability of a linear propagation model can be maintained by regularizing all the weights of each pixel in the hidden layer such that the sum of their absolute values is less than one. For the one-way connection, Chen et al. [1] constrain each scalar output p to lie within (0, 1). Liu et al. [4] extend the range to (−1, 1), where negative weights prove beneficial for learning image enhancers. This indicates that the affinity matrix need not be restricted to be positive/positive semi-definite (e.g., this setting is also used for a pre-defined affinity matrix in [3]). For the three-way connection, we simply regularize the three weights (the outputs of a deep CNN) according to Theorem 1, without any positive/positive semi-definite restriction.
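The condition above can be enforced by rescaling each pixel's weight vector whenever its absolute values sum to one or more. A minimal NumPy sketch, as an illustration only (the function name `stabilize_weights` and the small `eps` margin are our own choices, not necessarily the authors' exact implementation):

```python
import numpy as np

def stabilize_weights(w, eps=1e-8):
    """Rescale per-pixel propagation weights so that the sum of their
    absolute values stays strictly below one (the stability condition).
    w: array of shape (..., K) holding the K connection weights per pixel
       (K = 3 for the three-way connection)."""
    abs_sum = np.abs(w).sum(axis=-1, keepdims=True)
    # Rescale only pixels that violate the condition; weights that already
    # satisfy it are left unchanged. The (1 + eps) factor keeps the
    # rescaled sum strictly below one rather than exactly at one.
    scale = np.where(abs_sum >= 1.0, abs_sum * (1.0 + eps), 1.0)
    return w / scale

# Three-way weights for two pixels: the first violates the condition
# (|0.9| + |-0.8| + |0.5| = 2.2), the second already satisfies it (0.6).
w = np.array([[0.9, -0.8, 0.5],
              [0.2,  0.1, 0.3]])
ws = stabilize_weights(w)
```

After rescaling, every pixel's absolute weights sum to less than one, while already-stable pixels pass through unchanged, so the regularization does not distort weights that the network has learned within the valid range.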
Similar Resources
Learning Affinity via Spatial Propagation Networks
In this paper, we propose spatial propagation networks for learning the affinity matrix for vision tasks. We show that by constructing a row/column linear propagation model, the spatially varying transformation matrix exactly constitutes an affinity matrix that models dense, global pairwise relationships of an image. Specifically, we develop a three-way connection for the linear propagation mod...
Full Text
Supplementary Material for Video Propagation Networks
In this supplementary, we present experiment protocols and additional qualitative results for experiments on video object segmentation, semantic video segmentation and video color propagation. Table 1 shows the feature scales and other parameters used in different experiments. Figures 1, 2 show some qualitative results on video object segmentation with some failure cases in Fig. 3. Figure 4 sho...
Full Text
Supplementary Material: Unsupervised learning of object landmarks by factorized spatial embeddings
In this supplementary material we elaborate on several details regarding the experimental setup, provide an additional comparison with training a supervised network on small numbers of images and present numerous images giving a qualitative look at the performance of our method. It is organized as follows: Sec. 2 gives the additional details and hyperparameters, Sec. 3 compares quantitatively w...
Full Text
Supplementary Material to “Cooled and Relaxed Survey Propagation for MRFs” Hai
This is the supplementary material for the submission to NIPS 2007, entitled “Cooled and Relaxed Survey Propagation for MRFs”. The purpose of this material is to prove the update equations of Relaxed Survey Propagation (RSP) in the main paper.
Full Text
Graph-based Isometry Invariant Representation Learning: Supplementary material
We use supervised learning and train our network so that it maximizes the log-probability of estimating the correct class of training samples via logistic regression. Overall, we need to compute the values of the parameters in each convolutional and fully-connected layer. The other layers do not have any parameters to be estimated. We train the network using a classical back-propagation algo...
Full Text